Exact and Inexact Subsampled Newton Methods for Optimization

Authors

  • Raghu Bollapragada
  • Richard Byrd
Abstract

The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an inexact Newton method that solves linear systems approximately using the conjugate gradient (CG) method, and that samples the Hessian and not the gradient (the gradient is assumed to be exact). We provide a complexity analysis for this method based on the properties of the CG iteration and the quality of the Hessian approximation, and compare it with a method that employs a stochastic gradient iteration instead of the CG method. We report preliminary numerical results that illustrate the performance of inexact subsampled Newton methods on machine learning applications based on logistic regression.
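The central idea above — keep the gradient exact, subsample the Hessian, and solve the Newton system only approximately with CG — can be illustrated with a minimal sketch for logistic regression. This is not the paper's exact algorithm; the function name, sample size, CG iteration count, and unit step length are all illustrative choices.

```python
import numpy as np

def subsampled_newton_cg(X, y, w, sample_size, cg_iters=10, step=1.0, rng=None):
    """One step of an inexact subsampled Newton method (illustrative sketch):
    exact full gradient, subsampled Hessian accessed only via
    Hessian-vector products, Newton system solved by truncated CG."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = X.shape[0]
    # Exact logistic-regression gradient over the full data set.
    p = 1.0 / (1.0 + np.exp(-X @ w))
    grad = X.T @ (p - y) / n
    # Hessian-vector products using only a random subsample S of the data.
    S = rng.choice(n, size=sample_size, replace=False)
    Xs = X[S]
    ps = 1.0 / (1.0 + np.exp(-Xs @ w))
    d = ps * (1.0 - ps)  # per-sample curvature weights

    def hess_vec(v):
        return Xs.T @ (d * (Xs @ v)) / sample_size

    # Conjugate gradient on H_S p = -grad, truncated after cg_iters steps.
    pk = np.zeros_like(w)
    r = -grad.copy()
    q = r.copy()
    rs = r @ r
    for _ in range(cg_iters):
        Hq = hess_vec(q)
        alpha = rs / (q @ Hq)
        pk += alpha * q
        r -= alpha * Hq
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-10:
            break
        q = r + (rs_new / rs) * q
        rs = rs_new
    return w + step * pk
```

Because the subsampled Hessian enters only through `hess_vec`, the cost per CG iteration scales with the sample size rather than the full data set, which is what makes coordinating sample size against CG accuracy worthwhile.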


Related Articles

Globally Convergent Inexact Newton Methods

Inexact Newton methods for finding a zero of F are variations of Newton's method in which each step only approximately satisfies the linear Newton equation but still reduces the norm of the local linear model of F. Here, inexact Newton methods are formulated that incorporate features designed to improve convergence from arbitrary starting points. For each method, a basic global convergence ...


Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information

We consider variants of trust-region and cubic regularization methods for nonconvex optimization, in which the Hessian matrix is approximated. Under mild conditions on the inexact Hessian, and using approximate solutions of the corresponding sub-problems, we provide iteration complexity to achieve ε-approximate second-order optimality, which has been shown to be tight. Our Hessian approximation condi...


Inexact semismooth Newton methods for large-scale complementarity problems

The semismooth Newton method is a nonsmooth Newton-type method applied to a suitable reformulation of the complementarity problem as a nonlinear and nonsmooth system of equations. It is one of the standard methods for solving these kinds of problems, and it can be implemented in an inexact way so that all linear systems of equations have to be solved only inexactly. However, from a practical poi...


A nonmonotone inexact Newton method

In this paper we describe a variant of the Inexact Newton method for solving nonlinear systems of equations. We define a nonmonotone Inexact Newton step and a nonmonotone backtracking strategy. For this nonmonotone Inexact Newton scheme we present the convergence theorems. Finally, we show how we can apply these strategies to Inexact Newton Interior–Point method and we present some numerical ex...


An Investigation of Newton-Sketch and Subsampled Newton Methods

The concepts of sketching and subsampling have recently received much attention from the optimization and statistics communities. In this paper, we study Newton-Sketch and Subsampled Newton (SSN) methods for the finite-sum optimization problem. We consider practical versions of the two methods in which the Newton equations are solved approximately using the conjugate gradient (CG) method or a stoc...
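Both this abstract and the main paper above contrast a CG inner solver with a stochastic-gradient inner iteration for the Newton system H p = -g. The following sketch shows the stochastic-gradient variant for the logistic-regression Hessian, where each inner step uses a single-sample Hessian estimate instead of a CG iteration. The function name, inner iteration count, and step size are illustrative assumptions, not the papers' exact scheme.

```python
import numpy as np

def newton_sgi_direction(X, w, grad, inner_iters=50, stepsize=0.1, rng=None):
    """Inner stochastic-gradient solver (sketch) for the Newton system
    H p = -grad, with H the logistic-regression Hessian at w. Each step
    replaces H by a single-sample estimate d_i * x_i x_i^T."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = X.shape[0]
    sig = 1.0 / (1.0 + np.exp(-X @ w))
    d = sig * (1.0 - sig)  # per-sample curvature weights
    p_dir = np.zeros_like(w)
    for _ in range(inner_iters):
        i = rng.integers(n)
        xi = X[i]
        # Unbiased single-sample estimate of the residual H p + grad:
        Hi_p = d[i] * (xi @ p_dir) * xi
        p_dir -= stepsize * (Hi_p + grad)
    return p_dir
```

Each inner step costs O(dim) work on one sample, versus one subsampled Hessian-vector product per CG iteration; the trade-off between the two inner solvers is exactly what the complexity analyses in these papers quantify.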




Publication date: 2016